Chicago Tribune: Facebook did right in fight against hate

Facebook’s terms of service are clear: “We do not allow hate speech on Facebook because it creates an environment of intimidation and exclusion, and in some cases may promote real-world violence.”

The social network giant draws the line on hate organizations and their leaders who attack individuals based on race, religious affiliation, nationality, ethnicity, gender, sex, sexual orientation or disability.

On Thursday, Facebook announced it had banned seven users it determined were purveyors of hate, including Louis Farrakhan, longtime leader of the Chicago-based Nation of Islam, and right-wing conspiracy theorist Alex Jones. The move was widely seen as a bold broadside against extremists by a Silicon Valley behemoth on the defensive for failing to police itself.

Call it a bold move. Or just plain sensible.

Banning Farrakhan, Jones and the others is a simple matter of enforcing Facebook’s rules against those who turn their accounts into platforms for spreading hate and bigotry.

Chicagoans are familiar with Farrakhan’s vituperative rhetoric. He has called Jews “evil, Satanic,” according to the Anti-Defamation League, and in a speech last year said, “When you want something in this world, the Jew holds the door.” Jones, founder of the “Infowars” network, has long used Facebook to peddle his conspiracy theory that the Sandy Hook shooting, which killed 20 children, was a hoax.

Vile as their words are, Farrakhan, Jones and the others are within their First Amendment rights to spew venom from their own platforms. But when they disseminate on Facebook, they agree to abide by content policies set by the social network company. They broke the rules and got kicked off. That’s not government censorship; it’s an appropriate response to hate.

It’s also long overdue. Facebook’s previous approach relied on removing individual posts that violated its content rules. That didn’t stop the hate-mongering.

In New Zealand, the world saw what can happen when hate co-opts social media. A gunman live-streamed on Facebook as he sprayed gunfire at two mosques in Christchurch, killing 50 people. The 51st victim, a Turkish national, died of his injuries Friday.

Social media companies have a responsibility to patrol and regulate their venues. By stepping up, Facebook did the right thing.